Azure Percept: A machine learning quick starter

By Simon Bisson, Columnist, InfoWorld

Microsoft’s commitment to low-code and no-code application development goes a lot further than its Power Platform. The same connector and pipeline model powers its Azure Logic Apps platform and elements of the Azure Machine Learning studio. Connecting prebuilt elements together may not have the flexibility of developing your own applications from scratch, but it’s a quick way to deliver value. At the same time, it’s a way to bring in people with nontraditional development skills who can supply domain knowledge a project would otherwise lack.

One area where there’s a disconnect between application development and the physical world is health and safety. People are unpredictable, making it hard to design applications that can help identify potential dangers on the shop floor or around machinery. One option is to use computer vision–based machine learning to build models of normal behavior that allow anomalies to be quickly identified. A camera monitoring a set of gas pumps can be trained to identify someone smoking; a camera by a hydraulic press can be trained to monitor when an operator or passerby steps out of the safe space.

The question is how to build and deploy a safety-oriented machine learning system quickly. That’s where Microsoft’s Azure Percept platform comes in: a focused version of its Azure Internet of Things edge platform, combined with a set of hardware specifications and a cloud-hosted, low-code application development environment with a containerized deployment model. It offers a developer kit so you can build and test applications before deploying them to onsite systems. The kit uses the familiar 80/20 mounting rails found on much industrial electronic equipment, so it’s compatible with existing mounts and power distribution systems, keeping costs to a minimum.

Microsoft has done a lot to make its Azure Cognitive Services portable, delivering containerized runtimes that let you use edge hardware for inferencing instead of sending data to centralized Azure resources. This approach helps save on bandwidth costs, allowing you to deliver a much smaller set of results to your applications rather than sending gigabytes of streamed images. Edge sites are often bandwidth constrained, so using this approach allows you to run machine learning–based applications where they’re needed, not where there’s available bandwidth.

Getting started requires the relatively low-cost Azure Percept DK, currently selling for $349. It comes in two parts: an edge compute unit and a smart camera. A third component, a smart microphone, is available for audio-based prediction applications, such as monitoring motors for signs of possible failure. The edge compute unit is built around an NXP Arm processor running Microsoft’s own CBL-Mariner Linux distribution, and the camera uses a dedicated Intel Movidius computer vision processor. Both are designed to get you going quickly: Microsoft suggests you can go from “out-of-the-box to first AI frames in under 10 minutes.”

Applications are developed in the cloud-based Azure Percept Studio, with a selection of prebuilt models. If you’re familiar with Microsoft’s Cognitive Services tools, you can also use the Azure Machine Learning studio or a local development environment based on Visual Studio Code. The local toolkit is based on Python and TensorFlow, with Intel’s OpenVINO to support the Movidius vision processor. Other deployment environments, such as Nvidia’s, are supported, so you can build your own cameras using Jetson hardware or work with third-party vendors to add their hardware to a Percept deployment. The tools can be downloaded as a single dev pack, building out a ready-to-use environment on Windows, macOS, or Linux.
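As a sketch of what that local workflow looks like, the snippet below uses OpenVINO’s Python runtime, the toolkit the dev pack installs for Movidius support, to load a converted model and run a single inference. The model file name and input shape are assumptions for illustration; a real project would start from a TensorFlow model converted to OpenVINO’s IR format.

```python
# A minimal OpenVINO inference sketch (assumed model file and input shape).
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("person-detection.xml")  # hypothetical converted model
# "MYRIAD" targets the Movidius VPU on the Percept camera; "CPU" is handy
# for testing on a development machine
compiled = core.compile_model(model, device_name="CPU")

# Stand-in for a single camera frame; a real app would grab frames from
# the device's video stream
frame = np.random.rand(1, 3, 480, 640).astype(np.float32)
results = compiled([frame])
detections = results[compiled.output(0)]
print("raw detection tensor shape:", detections.shape)
```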

The built-in models are enough to get started, as they cover most common industrial Internet of Things vision scenarios. As well as detecting people and vehicles, there’s even a model to detect products on shelves. Along with an object detection model, this could give you a set of tools to quickly put together a basic stock-level tracker for monitoring consumables or ensuring that spare parts are available.
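As a sketch of that stock-tracking idea, assuming the detection model reports labels and confidence scores (the field names here are illustrative, not Percept’s actual output schema), the application logic needs only a few lines:

```python
# Flag any product whose detected count falls below a reorder threshold.
# The detection format and product names are assumptions for illustration.
REORDER_LEVELS = {"oil_filter": 4, "hydraulic_hose": 2}

def check_stock(detections, min_confidence=0.6):
    counts = {label: 0 for label in REORDER_LEVELS}
    for det in detections:
        if det["confidence"] >= min_confidence and det["label"] in counts:
            counts[det["label"]] += 1
    # Return only the items that have dropped below their reorder level
    return {label: n for label, n in counts.items() if n < REORDER_LEVELS[label]}

# Example detections as a shelf model might report them
sample = [
    {"label": "oil_filter", "confidence": 0.91},
    {"label": "oil_filter", "confidence": 0.88},
    {"label": "hydraulic_hose", "confidence": 0.72},
]
print(check_stock(sample))  # {'oil_filter': 2, 'hydraulic_hose': 1}
```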

You can take advantage of ready-to-run solutions like a people counter. This sample uses the camera to count the number of people in a designated area, delivering the results to an Azure data store and using a web application to display images along with count data. Although it’s not particularly useful on its own, it’s a good way to experiment with the Percept hardware, learning both how well it performs and the types of data it can deliver to your own solutions. Perhaps you’re operating a busy space that needs monitoring to ensure compliance with licensing or fire regulations, or maybe you want to get a feel for flow through common areas or how long people wait for elevators.
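Turning those counts into a compliance check is straightforward. Here’s a minimal sketch, assuming each message from the sample carries a timestamp and a people count (the field names and limit are illustrative, not the sample’s actual schema):

```python
# Raise an alert when occupancy exceeds a regulatory limit.
# Message fields and the limit itself are assumptions for illustration.
from datetime import datetime

MAX_OCCUPANCY = 50  # e.g., a fire-regulation limit for the monitored space

def check_occupancy(message):
    count = message["people_count"]
    if count > MAX_OCCUPANCY:
        ts = datetime.fromisoformat(message["timestamp"])
        print(f"ALERT {ts:%H:%M}: {count} people present, limit is {MAX_OCCUPANCY}")

check_occupancy({"timestamp": "2022-05-21T14:32:00", "people_count": 57})
```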

Building your first application is quick and simple. An Azure subscription is essential, as you’ll be using a Cognitive Services container to host and run your model. This does mean that once the platform comes out of preview it will have usage costs, but for now it’s free. The Azure Percept Dev Kit is treated as a device attached to an Azure IoT Hub. (You can create a new Hub or connect to an existing resource.) The device itself is connected to your wireless network and configured using its own built-in web server. For more detailed management, a device console is accessible over Secure Shell.
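Because the dev kit hangs off a standard IoT Hub, its messages can be read with the ordinary Azure SDKs. The sketch below uses the azure-eventhub package to listen on the hub’s Event Hubs-compatible endpoint; the connection string and hub name are placeholders you’d take from your own IoT Hub’s settings.

```python
# Reading device messages from an IoT Hub's Event Hubs-compatible endpoint.
# Connection string and name are placeholders for your own hub's values.
from azure.eventhub import EventHubConsumerClient

client = EventHubConsumerClient.from_connection_string(
    conn_str="<Event Hubs-compatible connection string>",
    consumer_group="$Default",
    eventhub_name="<Event Hubs-compatible name>",
)

def on_event(partition_context, event):
    # Each event wraps one JSON message sent by the device
    print(partition_context.partition_id, event.body_as_str())

with client:
    client.receive(on_event=on_event, starting_position="-1")  # from the start
```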

If you’re using the general computer vision model, you can train it much as you would train a model with the Custom Vision cognitive service. Simply take a series of pictures using the Azure Percept Studio and label them appropriately before training the model and evaluating its performance. Once trained, you can pick a version to deploy from the Studio to your device, using the web portal to test the model against a live stream from the Percept camera. Models can be retrained with additional data, using probability scores from your stream to choose which images to label for the next training pass.
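Percept Studio drives this capture-label-train loop through the portal, but since the workflow mirrors the Custom Vision service, a sketch of the same steps in Python gives a feel for what’s happening underneath. The endpoint, key, project, tag, and file names below are placeholders, and a real project needs more images and tags before training will succeed.

```python
# Training a Custom Vision model programmatically; Percept Studio wraps an
# equivalent workflow in the portal. All names and keys are placeholders.
from azure.cognitiveservices.vision.customvision.training import (
    CustomVisionTrainingClient,
)
from msrest.authentication import ApiKeyCredentials

credentials = ApiKeyCredentials(in_headers={"Training-key": "<training key>"})
trainer = CustomVisionTrainingClient("<endpoint>", credentials)

project = trainer.create_project("shop-floor-safety")
tag = trainer.create_tag(project.id, "person-near-press")

# Upload a labeled image captured from the Percept camera
with open("frame-001.jpg", "rb") as image:
    trainer.create_images_from_data(project.id, image.read(), tag_ids=[tag.id])

iteration = trainer.train_project(project.id)
print("training started:", iteration.status)
```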

More complex solutions built with Visual Studio Code and TensorFlow can be managed through GitHub, using a container registry to deploy ready-to-run models to Percept devices. Models have standard endpoints that can be used in other applications, either from custom code or as a low-code endpoint for Power Apps or Power Automate. For example, you can use the output from Percept as the input to a Stream Analytics job, with each detection a message that can be managed using familiar Azure tools.
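Consuming such an endpoint from custom code can be as simple as an HTTP call. Here’s a sketch assuming a scoring endpoint that accepts an image and returns JSON detections; the URL, headers, and response shape are assumptions for illustration, not Percept’s documented interface.

```python
# Calling a deployed model's scoring endpoint from custom code.
# URL, headers, and JSON response shape are assumptions for illustration.
import requests

ENDPOINT = "http://percept-device.local/score"  # hypothetical endpoint

with open("frame.jpg", "rb") as f:
    resp = requests.post(
        ENDPOINT,
        data=f.read(),
        headers={"Content-Type": "application/octet-stream"},
        timeout=10,
    )
resp.raise_for_status()
for det in resp.json().get("detections", []):
    print(det["label"], det["confidence"])
```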

Tools like Azure Percept bridge the gap between hardware and software, giving you simple ways to manage and develop your own intelligent hardware. Azure Percept gives you access to affordable hardware that’s ready to work with containerized machine learning models, with a focus on a limited set of scenarios. When Microsoft talks about “the intelligent edge,” it’s talking about devices like this. Devices become microservices, and starting with no-code development leads directly to building your own custom machine learning models, going from beginner to expert on your own schedule.

Author of InfoWorld's Enterprise Microsoft blog, Simon Bisson has worked in academic and telecoms research, been the CTO of a startup, run the technical side of UK Online, and done consultancy and technology strategy.

Copyright © 2022 IDG Communications, Inc.
